Supplementary material for: Generalizing from Several Related Classification Tasks to a New Unlabeled Sample

Authors

  • Gilles Blanchard
  • Gyemin Lee
  • Clayton Scott
Abstract

The function k : Ω × Ω → R is called a kernel on Ω if the matrix (k(xi, xj))1≤i,j≤n is positive semidefinite for all positive integers n and all x1, . . . , xn ∈ Ω. It is well known that if k is a kernel on Ω, then there exists a Hilbert space H̃ and a map Φ̃ : Ω → H̃ such that k(x, x′) = 〈Φ̃(x), Φ̃(x′)〉H̃. While H̃ and Φ̃ are not uniquely determined by k, the Hilbert space of functions Hk = {〈v, Φ̃(·)〉H̃ : v ∈ H̃} is uniquely determined by k, and is called the reproducing kernel Hilbert space (RKHS) of k.
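
As an illustration not taken from the paper, the following minimal sketch (assuming a Gaussian kernel k(x, x′) = exp(−‖x − x′‖²/(2σ²)) and NumPy) checks the positive-semidefiniteness condition numerically on a small Gram matrix and builds an element of the RKHS as a finite kernel expansion:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gram matrix K[i, j] = k(x_i, y_j) for the Gaussian kernel
    k(x, x') = exp(-||x - x'||^2 / (2 * sigma^2))."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2 * X @ Y.T)
    return np.exp(-sq_dists / (2 * sigma**2))

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))   # five points x_1, ..., x_5 in Omega = R^3

# Defining property of a kernel: the Gram matrix is positive semidefinite.
K = gaussian_kernel(X, X)
assert np.linalg.eigvalsh(K).min() > -1e-10   # PSD up to numerical error

# An element of the RKHS H_k: f(.) = sum_i alpha_i * k(x_i, .),
# i.e. f = <v, Phi(.)> with v = sum_i alpha_i * Phi(x_i).
alpha = rng.normal(size=5)
f = lambda z: float(alpha @ gaussian_kernel(X, np.atleast_2d(z)).ravel())
print(f(np.zeros(3)))
```

Any such finite expansion f(·) = Σi αi k(xi, ·) equals 〈v, Φ̃(·)〉H̃ with v = Σi αi Φ̃(xi), so it lies in Hk.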

Similar articles

Generalizing from Several Related Classification Tasks to a New Unlabeled Sample

We consider the problem of assigning class labels to an unlabeled test data set, given several labeled training data sets drawn from similar distributions. This problem arises in several applications where data distributions fluctuate because of biological, technical, or other sources of variation. We develop a distribution-free, kernel-based approach to the problem. This approach involves ident...

Sample-oriented Domain Adaptation for Image Classification

Image processing is a method of performing operations on an image in order to obtain an enhanced image or to extract useful information from it. Conventional image processing algorithms do not perform well in scenarios where the training images (source domain) used to learn the model have a different distribution from the test images (target domain). Also, many real-world applicat...

Deep Unsupervised Domain Adaptation for Image Classification via Low Rank Representation Learning

Domain adaptation is a powerful technique when a large amount of labeled data with similar attributes is available across different domains. In real-world applications there is a huge amount of data, but most of it is unlabeled. Domain adaptation is effective in image classification, where it is expensive and time-consuming to obtain adequate labeled data. We propose a novel method named DALRRL, which consists of deep ...

Constraint-Driven Active Learning Across Tasks

Consider a set of T tasks, each with a response variable Yi, i = 1, 2, . . . , T. Each sample in our training set x ∈ U is associated with these T labels (labels are not necessarily binary). For each sample, we might know a few (or none) of its T labels. We use UL(x) to denote the set of unknown labels on a sample x: UL(x) = {Yi : Yi is unknown for sample x}. Given labeled examples, we can lea...
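
As a small sketch not taken from the referenced work (the dictionary representation and helper name below are hypothetical), the unknown-label set UL(x) could be computed as:

```python
from typing import Dict, Optional, Set

def unknown_labels(labels: Dict[str, Optional[int]]) -> Set[str]:
    """UL(x) = {Y_i : Y_i is unknown for sample x}; None marks an unknown label."""
    return {task for task, y in labels.items() if y is None}

# Example: a sample with T = 4 task labels, two of which are unknown.
x_labels = {"Y1": 1, "Y2": None, "Y3": 0, "Y4": None}
print(unknown_labels(x_labels))  # e.g. {'Y2', 'Y4'}
```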

Supplementary material: Learning and Selecting Features via Point-wise Gated Boltzmann Machines

There are many classification tasks where we are given a large number of unlabeled examples in addition to only a few labeled training examples. In such a scenario, it is important to include unlabeled examples during training in order to generalize well to unseen data and thus avoid overfitting. Larochelle and Bengio (2008) proposed the semi-supervised training of the discriminative restricted ...

Publication date: 2011